Randomized interpolative decomposition of separated representations
Authors
Abstract
We introduce an algorithm to compute tensor Interpolative Decomposition (tensor ID) for the reduction of the separation rank of Canonical Tensor Decompositions (CTDs). Tensor ID selects, for a user-defined accuracy ε, a near-optimal subset of terms of a CTD to represent the remaining terms via a linear combination of the selected terms. Tensor ID can be used as an alternative to or in combination with the Alternating Least Squares (ALS) algorithm. We present examples of its use within a convergent iteration to compute inverse operators in high dimensions. We also briefly discuss the spectral norm as a computational alternative to the Frobenius norm in estimating approximation errors of tensor ID. We reduce the problem of finding tensor IDs to that of constructing Interpolative Decompositions of certain matrices. These matrices are generated via randomized projection of the terms of the given tensor. We provide cost estimates and several examples of the new approach to the reduction of separation rank.
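The last two sentences describe the core construction, which can be illustrated with a short sketch: each weighted rank-one term of the CTD is projected onto a few random separable vectors, the projections are collected as columns of a small matrix, and a matrix ID of that matrix selects which terms to keep and the coefficients for the discarded ones. The Python/NumPy snippet below is a minimal sketch under our own assumptions (the factor-matrix layout, the helper names ctd_random_projection and matrix_id, and a pivoted-QR-based ID); it is not the authors' implementation.

```python
import numpy as np
from scipy.linalg import qr

def ctd_random_projection(factors, s, n_proj, rng):
    """Project each weighted rank-one term of a CTD onto n_proj random
    separable vectors; column l of the result encodes term l."""
    M = np.ones((n_proj, len(s)))
    for U in factors:                      # U: (n_i, r) factors in direction i
        G = rng.standard_normal((n_proj, U.shape[0]))
        M *= G @ U                         # inner products along direction i
    return M * s                           # fold in the term weights s_l

def matrix_id(M, eps):
    """Column interpolative decomposition M ~= M[:, J] @ T via pivoted QR."""
    _, R, piv = qr(M, mode='economic', pivoting=True)
    diag = np.abs(np.diag(R))
    k = max(1, int(np.sum(diag > eps * diag[0])))   # numerical rank to accuracy eps
    J = piv[:k]
    T = np.linalg.lstsq(M[:, J], M, rcond=None)[0]
    return J, T

# Example: 60 terms in dimension d = 4, but only 15 distinct rank-one terms.
rng = np.random.default_rng(0)
d, n = 4, 32
base = [rng.standard_normal((n, 15)) for _ in range(d)]
factors = [np.repeat(U, 4, axis=1) for U in base]   # 60 redundant terms
s = rng.uniform(0.5, 1.5, size=60)

M = ctd_random_projection(factors, s, n_proj=120, rng=rng)
J, T = matrix_id(M, eps=1e-6)
s_new = s[J] * T.sum(axis=1)               # recombined weights of the kept terms
print("kept", len(J), "of", len(s), "terms")
```

In this toy example the 60 redundant terms collapse back to 15; the reduced CTD keeps the factor columns indexed by J in every direction, together with the recombined weights s_new.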
Similar resources
RSVDPACK: An implementation of randomized algorithms for computing the singular value, interpolative, and CUR decompositions of matrices on multi-core and GPU architectures
RSVDPACK is a library of functions for computing low rank approximations of matrices. The library includes functions for computing standard (partial) factorizations such as the Singular Value Decomposition (SVD), as well as so-called “structure preserving” factorizations such as the Interpolative Decomposition (ID) and the CUR decomposition. The ID and CUR factorizations pick subsets of the rows/...
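RSVDPACK's own interfaces are not reproduced here; purely as an illustration of how a randomized ID stays “structure preserving” by selecting actual columns of the input, the following sketch (function name, signature, and oversampling choice are ours) forms a small Gaussian row sketch, picks columns with pivoted QR on the sketch, and expresses all columns through the selected ones.

```python
import numpy as np
from scipy.linalg import qr

def randomized_column_id(A, k, oversample=10, rng=None):
    """Rank-k column ID, A ~= A[:, J] @ T, computed from a small row sketch."""
    rng = rng or np.random.default_rng()
    Omega = rng.standard_normal((k + oversample, A.shape[0]))
    Y = Omega @ A                  # (k+p) x n sketch; column dependencies preserved w.h.p.
    _, R, piv = qr(Y, mode='economic', pivoting=True)
    J = piv[:k]                    # indices of the selected columns of A
    T = np.linalg.lstsq(Y[:, J], Y, rcond=None)[0]   # coefficients from the sketch
    return J, T

rng = np.random.default_rng(1)
A = rng.standard_normal((2000, 30)) @ rng.standard_normal((30, 400))  # rank ~30
J, T = randomized_column_id(A, k=30, rng=rng)
print(np.linalg.norm(A - A[:, J] @ T) / np.linalg.norm(A))
```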
Parallel Implementation of Fast Randomized Algorithms for Low Rank Matrix Decomposition
We analyze the parallel performance of randomized interpolative decomposition by decomposing low rank complex-valued Gaussian random matrices up to 64 GB. We chose a Cray XMT supercomputer as it provides an almost ideal PRAM model, permitting quick investigation of parallel algorithms without obfuscation from hardware idiosyncrasies. We find that on non-square matrices performance becomes very...
Literature survey on low rank approximation of matrices
Low rank approximation of matrices has been well studied in the literature. Singular value decomposition, QR decomposition with column pivoting, rank-revealing QR factorization (RRQR), Interpolative decomposition, etc., are classical deterministic algorithms for low rank approximation. But these techniques are very expensive (O(n³) operations are required for n × n matrices). There are several rando...
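To make the contrast concrete: a randomized rank-k approximation in the range-finder style costs roughly O(mnk) rather than O(n³). The sketch below is a generic illustration with names of our own choosing, not code taken from the survey.

```python
import numpy as np

def randomized_svd(A, k, oversample=10, rng=None):
    """Approximate rank-k SVD via a Gaussian range finder, ~O(m n k) work."""
    rng = rng or np.random.default_rng()
    G = rng.standard_normal((A.shape[1], k + oversample))
    Q, _ = np.linalg.qr(A @ G)             # orthonormal basis capturing range(A)
    B = Q.T @ A                            # small (k+p) x n matrix
    Ub, sv, Vt = np.linalg.svd(B, full_matrices=False)
    return Q @ Ub[:, :k], sv[:k], Vt[:k, :]

rng = np.random.default_rng(2)
A = rng.standard_normal((3000, 30)) @ rng.standard_normal((30, 1000))  # rank ~30
U, sv, Vt = randomized_svd(A, k=30, rng=rng)
print(np.linalg.norm(A - (U * sv) @ Vt) / np.linalg.norm(A))
```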
Far-Field Compression for Fast Kernel Summation Methods in High Dimensions
We consider fast kernel summations in high dimensions: given a large set of points in d dimensions (with d ≫ 3) and a pair-potential function (the kernel function), we compute a weighted sum of all pairwise kernel interactions for each point in the set. Direct summation is equivalent to a (dense) matrix-vector multiplication and scales quadratically with the number of points. Fast kernel summatio...
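For reference, direct summation with an explicitly formed kernel matrix makes the quadratic scaling visible: both the work and the memory in the sketch below grow as N². This is a minimal sketch with a Gaussian kernel; the function name, kernel choice, and bandwidth are our own and are not taken from the paper.

```python
import numpy as np

def direct_kernel_sum(points, weights, bandwidth=1.0):
    """u_i = sum_j w_j exp(-||x_i - x_j||^2 / h^2), via a dense N x N
    matrix-vector product: O(N^2 d) work and O(N^2) memory."""
    sq = np.sum(points**2, axis=1)
    dist2 = sq[:, None] + sq[None, :] - 2.0 * points @ points.T
    K = np.exp(-dist2 / bandwidth**2)      # dense kernel matrix
    return K @ weights

rng = np.random.default_rng(3)
N, d = 4000, 8                             # modest N, dimension well above 3
X = rng.standard_normal((N, d))
w = rng.standard_normal(N)
u = direct_kernel_sum(X, w)                # doubling N quadruples the cost
print(u.shape)
```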
A CUR Factorization Algorithm based on the Interpolative Decomposition
An algorithm for the efficient computation of the CUR decomposition is presented. The method is based on simple modifications to the classical truncated pivoted QR decomposition, which means that highly optimized library codes can be utilized for its implementation. Numerical experiments demonstrate advantageous performance compared to existing techniques for computing CUR factorizations.
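As an illustration of the general idea (not the specific algorithm of the paper), a CUR factorization can be assembled from a column selection and a row selection produced by truncated pivoted QR, with a small connecting matrix in the middle. In this sketch the connecting matrix is built with pseudoinverses for simplicity, whereas the cited method derives it from the ID coefficients; all names below are ours.

```python
import numpy as np
from scipy.linalg import qr

def select_columns(A, k):
    """Indices of k columns chosen by truncated pivoted QR."""
    _, _, piv = qr(A, mode='economic', pivoting=True)
    return piv[:k]

def cur_from_id(A, k):
    """CUR factorization A ~= C @ U @ R, where C and R are actual columns
    and rows of A and U is a small k x k connecting matrix."""
    cols = select_columns(A, k)            # column selection -> C
    rows = select_columns(A.T, k)          # row selection (ID of A^T) -> R
    C, R = A[:, cols], A[rows, :]
    U = np.linalg.pinv(C) @ A @ np.linalg.pinv(R)   # U = C^+ A R^+
    return C, U, R

rng = np.random.default_rng(4)
A = rng.standard_normal((500, 40)) @ rng.standard_normal((40, 300))   # rank ~40
C, U, R = cur_from_id(A, k=40)
print(np.linalg.norm(A - C @ U @ R) / np.linalg.norm(A))
```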
Journal: J. Comput. Physics
Volume: 281, Issue: -
Pages: -
Publication date: 2015